Development of Technical Assistance Manual for Energy Simulation Tool Approval

 

Conference Call No. 2

Draft Meeting Minutes

 

October 17, 2011

1:00 P.M. – 3:00 P.M.

Department of Community Affairs
Building Codes and Standards

2555 Shumard Oak Boulevard
Tallahassee, Florida 32399-2100
(850) 487-1824

Meeting Purpose:

As part of the new 2010 Florida Energy Code, the Commission will be charged with the responsibility of approving energy simulation tools (tool).

 

Under the proposed service, the Contractor, JM Jadu Corp, will develop the procedure (aka Manual) for reviewing and validating the tools. 

Meeting Objectives:

Review the first draft of the technical assistance manual.

 

The meeting commenced with roll call and confirmation of a quorum of the workgroup members.

The agenda for the current meeting and the minutes of the first meeting were approved without any changes.  Mr. Madani explained that the manual is now called the “Technical Assistance Manual” and is available for vendors to use when submitting their software tools.

 

Using a comments matrix emailed to members and available on the website, www.floridabuilding.org, Mr. Jadunandan reviewed the comments, which came primarily from two groups.  The first comment concerned changing the term “equivalent results” to “similar results.”  It was concluded that compliance software either passes or fails the tests, and it was agreed to strike the entire sentence.  The comments regarding time dependent valuation testing results were also stricken from the manual; it was agreed that submitting content about failed test results is not applicable.

 

The next comment concerned the type of software testing used in this manual.  The testing is based on inter-model comparisons and is one portion of an overall validation methodology first developed by NREL in 1983 (Judkoff et al. 1983).  The draft listed the three types of evaluation in the overall methodology; this manual uses only the comparative testing method, in which a program is compared to itself or to other programs.  It was agreed that only this information is necessary in the manual.
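In comparative testing, the candidate tool runs the same test cases as a set of reference programs, and its results are checked against the spread of the reference results. A minimal sketch of such a pass/fail comparison (case names and numbers below are hypothetical, not actual BESTEST data; published acceptance ranges, not a raw min–max spread, govern the real tests):

```python
# Hypothetical annual-energy results (MBtu) produced by several reference
# programs for two test cases.  In practice these ranges come from the
# published reference results, not from re-running the reference programs.
reference_results = {
    "L100A": [48.5, 50.2, 51.9],
    "L110A": [61.0, 62.4, 64.8],
}

# The candidate compliance software's results for the same test cases.
candidate = {"L100A": 49.7, "L110A": 66.1}

def check_case(case, value):
    """Pass if the candidate result falls within the min-max spread of the
    reference-program results (a simplified stand-in for published ranges)."""
    refs = reference_results[case]
    return min(refs) <= value <= max(refs)

for case, value in candidate.items():
    verdict = "PASS" if check_case(case, value) else "FAIL"
    print(f"{case}: {value} -> {verdict}")
```

The second case fails because the candidate result lies above the reference spread; under the manual as discussed, such a result would require a logical explanation from the vendor.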

 

Mr. Jadunandan informed the workgroup that detailed testing items 5 through 9 will be removed.  On test item number 3, the only data available for Florida is the Florida HERS BESTEST results.   Only the original weather file and a complete, coherent package for Orlando are available, and Orlando is the choice of the workgroup.  Las Vegas and Colorado Springs were suggested as possible additional comparison locations: Las Vegas may be used for heating and Colorado Springs for cooling.  Reference results for Las Vegas and Colorado Springs are not provided by the State.

 

There was some concern that only one data set is available.  Mr. Jadunandan informed the group that the State currently will not conduct any new tests to generate new reference results.  Mr. Neymark informed the group that most tools have run the Orlando, Las Vegas, and Colorado Springs cases to conform to RESNET requirements.

It was agreed that all three locations will be required by all vendors.

 

The other comment was regarding Tier 1 and Tier 2 testing.  It was agreed that only Tier 1 is necessary.  Tier 2 testing primarily covers passive solar testing and building orientations.  It was agreed that Tier 2 testing be stricken from the manual.

 

A new set of reference material, ASHRAE 140-2011, may soon be available.  The group agreed to use ASHRAE 140-2007 as the reference material instead of ASHRAE 140-2004.

 

Mr. Neymark explained the reasons for allowing a logical explanation if a vendor’s results fall just outside of the acceptable range.  The discussion continued regarding COMNET references and auto-generation testing.   Mr. Swami expressed concern about auto-generation, and Mr. Madani stated that the vendors and the State will continue to work on test cases and results for a future revision of the manual.  Mr. Swami explained that testing is available for performance testing, but not for generating the code-compliant reference design.  The State agreed that currently there is no auto-generation test, but will continue to encourage the vendors to develop these tests.

 

It was suggested that the State provide a design test case and have the vendor provide an auto-generated reference design.  The State can then manually check the results to ensure that the reference design is generated correctly.  All agreed that it is an important area for the State to fund and continue working on.

 

It was agreed that the Vendor shall include a statement that the software is capable of auto-generating the reference design using only the user input of the proposed design.  This requirement is stated in the building code.

 

The next comment was regarding the change in the formula in Appendix A from N-1 to N.  Mr. Neymark explained the reasoning for changing the formula and how it was modified to yield the same results.  Further discussion included the possibility that the reference results will no longer be applicable.  However, since the vendors are no longer generating reference results for other cities, the entire reference is not necessary.  It was agreed to remove Appendix A from the draft since the vendor is only using data previously generated.
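The N versus N-1 change refers to the denominator of the standard deviation used in the acceptance-range calculation. A minimal sketch (with hypothetical reference results and a hypothetical confidence coefficient; the actual formula and coefficients are those of Appendix A and HERS BESTEST) of why rescaling the coefficient keeps the acceptance range unchanged, as described in the minutes:

```python
import math

def std_dev(values, denominator_n=True):
    """Standard deviation of reference-program results.

    denominator_n=True  -> divide by N   (the revised formula)
    denominator_n=False -> divide by N-1 (the original formula)
    """
    n = len(values)
    mean = sum(values) / n
    ss = sum((v - mean) ** 2 for v in values)
    return math.sqrt(ss / (n if denominator_n else n - 1))

# Hypothetical annual-energy results (MBtu) from N = 3 reference programs.
ref = [41.0, 43.5, 45.1]
mean = sum(ref) / len(ref)

sd_n = std_dev(ref, denominator_n=True)     # divides by N
sd_n1 = std_dev(ref, denominator_n=False)   # divides by N-1

# Acceptance range = mean +/- coefficient * std dev.  If the coefficient is
# rescaled by sqrt(N / (N-1)) when moving from N-1 to N, the range is the same.
coeff_n1 = 2.0                                              # hypothetical
coeff_n = coeff_n1 * math.sqrt(len(ref) / (len(ref) - 1))   # rescaled

range_n1 = (mean - coeff_n1 * sd_n1, mean + coeff_n1 * sd_n1)
range_n = (mean - coeff_n * sd_n, mean + coeff_n * sd_n)
```

Because `coeff_n * sd_n` equals `coeff_n1 * sd_n1` algebraically, the target ranges remain identical, which matches Mr. Judkoff’s note that the example confidence-interval criteria were adjusted when the denominator changed.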

 

Mr. Jadunandan stated that Appendix B is informational, and all agreed that it should be for reference only.  Appendix B will be deleted.  Both appendices will be replaced with templates of Forms 405 and 506 and the EPL Card.

 

The next comment regarded the forms (405, 506 and the EPL Card) referenced in the Florida Building Code and generated by Energy Gauge.  It was agreed to use FSEC-generated forms as templates.

 

A comment was made concerning software vendors releasing their algorithms.  No definite conclusion was reached on whether the vendor should release their algorithm.

 

Concerns were raised about how the State will handle code changes and compliance tool approval.  It was noted that the manual already addresses these types of changes.

 

A question was asked about public notification of issues and actions.  Mr. Madani said that this is handled on a case-by-case basis and through public proceedings via the Commission.

 

A question was asked about who reviews the application.  Mr. Madani answered that staff and the general public have access to the vendor certification; the process is self-certification.  Mr. Philip suggested it is more of a vendor testing process.  Mr. Madani stated that staff will do spot checks of the applications.  Vendor certification is open to peer review.

 

Mr. Philip questioned why the software is responsible for sizing the HVAC (Section 5.1, 2nd bullet).  It was agreed to strike this bullet item.  It is required by code, but not needed as a requirement in the compliance software; this process is handled by the HVAC contractor and plan reviewers.  The only potential issue is some “gaming of the system” that some plan reviewers may not catch.  The manual may include text suggesting that unmet hours must be met; some suggestions were 0 or 1%, but no definite replacement text or number was agreed upon.

 

Concern was raised that some approved software may generate slightly different results on a similar design.  Users may end up purchasing all of the approved compliance software tools and using the one that generates the desired results.  Making algorithms available may provide a solution, and the question was raised of who approves the application and software.  Mr. Madani suggested that the State will deal with these issues as they arise.


 

 

Florida Department of Business and Professional Regulation

FLORIDA BUILDING CODE AND STANDARDS

 

Review 1st Draft of the Technical Assistance Manual for Energy Simulation Tool Approval

 

Summary of Comments and Proposed Changes Preview

 

Post Workgroup meeting of October 17, 2011

AM: Approved as amended

Page #

Section

Comments

Proposed Changes

Remarks

6

2

Joel Neymark (JN)  comment: The approval tests minimize differences in interpretation by providing explicit detailed descriptions of the test buildings that must be analyzed. For differences in the compliance software's algorithms, the Commission allows algorithms that yield equivalent results to those of the reference programs that are identified for each designated test suite.

The approval tests minimize differences in interpretation by providing explicit detailed descriptions of the test buildings that must be analyzed. For differences in the compliance software's algorithms, the Commission allows algorithms that yield similar results to those of the reference programs that are identified for each designated test suite.

See underlined text

 

Oct. 17, 2011

WG: AM - remove text marked in red.

19

5.1

JN comment: For tests that DO NOT COMPLY, the vendor shall supply diagnostic output that indicates noncompliance and gives the TDV [define TDV] energy information needed to evaluate the test criteria

For tests that DO NOT COMPLY, the vendor shall supply diagnostic output that indicates noncompliance and gives the time dependent valuation energy information needed to evaluate the test criteria

See underlined text

 

Oct. 17, 2011

WG: AM – delete paragraph as noted in red.

 

 

 

 

21

6.1

JN comment: the bulletined material is updated in Annex B23 of 140-2007 Addendum C or 140-2011 will have same – ASHRAE should publish these later this month, but we can figure out a way to get you internal version if you need them sooner. – or otherwise use the updated bullets from Judkoff and Neymark 2006]

·         Analytical Verification – in which the output from a program, subroutine, or algorithm is compared to the results from a known analytical solution for isolated heat transfer mechanism under simple boundary conditions

·         Empirical Validation – in which calculation results from a program, subroutine, or algorithm, is compared to monitored data from a real structure, test cell, or laboratory experiment

·         Comparative testing – in which a program is compared to itself or to other programs.  The comparative approach includes “sensitivity testing” and “intermodel comparisons.”

 

Replace bulleted text with the text depicted below:

Empirical Validation—in which calculated results from a program, subroutine, algorithm, or software object are compared to monitored data from a real building, test cell, or laboratory experiment.

Analytical Verification—in which outputs from a program, subroutine, algorithm, or software object are compared to results from a known analytical solution or a generally accepted numerical method for isolated heat transfer under very simple, highly constrained boundary conditions.

Comparative Testing—in which a program is compared to itself or to other programs.

 

updated bullets from Judkoff and Neymark 2006

 

Oct. 17, 2011

WG:  AM – restructure section so that validation analysis is limited to “comparative testing”.

23

 

JN comment: edits per Barnaby, Fairey, Judkoff, Neymark emails of 10/4 and 10/5. Anything other than running Florida HERS BESTEST for only the Orlando climate is not workable, unless the Florida Commission wants to fund a research project to generate new reference results for other climates],

 

Bill Wright:

 

The intent of this section is reasonable – to show that the proposed compliance software complies with some accepted standard. Items 1) and 2) make use of work described in the reference document (from 1997 work by NREL) on page 22. Unfortunately, the Tier 1 tests are the only ones we can find that any software vendors have compared to. Tier 2 tests are unnecessarily extreme, apply to a very small group in the market, and require a scope of effort out of proportion to the existing or likely future market. Tier 1 tests in Orlando will achieve the intent of this procedure.

 

 

 Item 3) asks vendors to use Appendix A in the Manual to “calculate the acceptable range”. This is an unnecessary complexity to burden vendors when the State should provide the acceptable range.

 

Unfortunately, at least one of the three programs is no longer available and two of the three are no longer supported. In addition, TMY3 data could be used for the 9 cities other than Orlando, but the original data are not available. Furthermore, asking vendors to compute acceptable ranges for 9 cities multiplied by all the cases required for Tier 1 has the same problems discussed in the preceding section regarding item 3). Any criteria for acceptance should be supplied by the State, not generated by vendors.

 

Items 5) through 9) on Page 23 are unworkable. They should be dropped entirely. We contacted the authors of the reference document and asked if the software to perform these tests as described on page 23 was available, and here is what Ron Judkoff replied in an email to us and Mo Madani:

 

 

Ron Judkoff:

 In August of 1997, NREL published “Home Energy Rating System Building Energy Simulation Test for Florida (Florida-HERS BESTEST)” in two volumes. Volume 1 was a User’s Manual and Volume 2 contained Reference Results. The reason for publishing this separate version of HERS BESTEST was that Phil Fairey was concerned that we did not emphasize conditions and issues important in the Florida climate enough in the original HERS BESTEST. I lost track of what the state of Florida did with this after it was published, but the requirements for the FY10 Florida code are misguided, inappropriate, and probably impossible to comply with. I don’t know what the immediate remedy is here, but I am happy to weigh in on what a more reasonable requirement should look like.

 

Delete “Detail Testing Procedures”

 

The State of Florida position is not to provide a range or additional reference results at this time.  This burden remains on the Vendor.

 

The following shall be deleted from the manual

 

Detail Testing Procedures

5)      Determine weather  data for the following ten cities (use City Hall as reference address):

Pensacola, Tallahassee, Jacksonville, Gainesville, Orlando, West Palm Beach, Miami, Key West, Naples and Tampa

 

6)      Substitute the values in Table 2-1

 

7)      Run cases in the following Programs identified in the reference document

a.       BLAST 3.0 Level 215

b.      DOE-2.1E-W54

c.       SERIRES/SUNCODE 5.7

 

8)      Repeat the test cases in the proposed compliance software program

9)      Determine the pass/fail ranges  as specified in Appendix A

 

Discussion on Orlando, Las Vegas, or Colorado Springs data

 

Availability of reference programs

 

a.       BLAST 3.0 Level 215

b.      DOE-2.1E-W54

c.       SERIRES/SUNCODE 5.7

Oct. 17, 2011

WG: for testing use Orlando and other cities as applicable (i.e., Las Vegas and Colorado Springs)

 

Use Tier 1 and delete the reference to Tier 2 – more applicable to passive solar and not applicable to Florida.

 

Delete “detail testing procedures” as noted in column 2.

29

7.1

[JN comment:

Heading General Requirements

And Scope has no text. 

Move text depicted below from

Heading “Calculation software tools” to General Requirements

 

Calculation procedures used to comply with this section shall be only compliance software tools approved by the Florida Building Commission to be capable of calculating the annual energy consumption of all building elements that differ between the standard reference design and the proposed design and shall include the following capabilities.

 

1.       Computer generation of the standard reference design using only the input for the proposed design. The calculation procedure shall not allow the user to directly modify the building component characteristics of the standard reference design.

2.       Building operation for a full calendar year (8760 hours).

3.       Climate data for a full calendar year (8760 hours) and shall reflect approved coincident hourly data for temperature, solar radiation, humidity and wind speed for the building location.

4.       Ten or more thermal zones.

5.       Thermal mass effects.

6.       Hourly variations in occupancy, illumination, receptacle loads, thermostat settings, mechanical ventilation, HVAC equipment availability, service hot water usage and any process loads.

7.       Part-load performance curves for mechanical equipment.

8.       Capacity and efficiency correction curves for mechanical heating and cooling equipment.

9.       Printed code official inspection checklist listing each of the proposed design component characteristics from Table 506.5.1(1) determined by the analysis to provide compliance, along with their respective performance ratings (e.g., R-value, U-factor, SHGC, HSPF, AFUE, SEER, EF, etc.).

 

Will add section numbers to remaining headings for additional clarity

 

Oct. 17, 2011

WG:  agree - add section numbers to remaining headings for additional clarity.

35

7.2

ASHRAE Standard 140-2007 Tests

 

JN comment: 140-2011 should be published by ASHRAE in late October – consider updating to that, and if yes, cite “140-2011 Class I tests of Section 5”]

 

Acceptable

Need copy of 140-2011

 

Oct. 17, 2011

WG:  Use 140-2007 for consistency with the FBC, Energy Conservation.

35

7.2

JN comment:

COMNET compliant software is required to perform the ASHRAE Standard 140-2007 suite of software tests and the results of these tests shall conform to the COMNET acceptance requirements [provide the COMNET document reference]. All tests shall be completed in accordance with the requirements of ASHRAE Standard 140-2007. The resulting estimates of energy consumption shall fall between the minimum and maximum values established by COMNET, unless a logical explanation is provided using the standard output report block for “anomalous results” provided with Standard 140-2007 Addendum a [this will also be included in 140-2011]. The portfolio folder for Appendix E contains spreadsheets wherein the software vendor enters the results of the Standard 140 simulations for comparison against the criteria. When results from candidate software fall outside the COMNET acceptance range or when candidate software is unable to perform one of the tests, the vendor shall provide an explanation of the reason as per ASHRAE Standard 140-2007 Addendum A [or -2011] requirements. The portfolio folder for Appendix E also contains a methodology paper that describes how the COMNET acceptance criteria were developed.

 

 

COMNET compliant software is required to perform the ASHRAE Standard 140-2007 suite of software tests and the results of these tests shall conform to the COMNET acceptance requirements [provide the COMNET document reference]. All tests shall be completed in accordance with the requirements of ASHRAE Standard 140-2007. The resulting estimates of energy consumption shall fall between the minimum and maximum values established by COMNET, unless a logical explanation is provided using the standard output report block for “anomalous results” provided with Standard 140-2007 Addendum a [this will also be included in 140-2011]. The portfolio folder for Appendix E contains spreadsheets wherein the software vendor enters the results of the Standard 140 simulations for comparison against the criteria. When results from candidate software fall outside the COMNET acceptance range or when candidate software is unable to perform one of the tests, the vendor shall provide an explanation of the reason as per ASHRAE Standard 140-2007 Addendum A [or -2011] requirements. The portfolio folder for Appendix E also contains a methodology paper that describes how the COMNET acceptance criteria were developed.

 

Insert COMNET document reference

 

Underlined text added

 

Update reference to depict Standard 140-2007 Addendum a [or 140-2011].

 

Oct. 17, 2011

WG: for testing auto generation of the reference design use the criteria stated in Section B - 1.2 of the 2010 FBC, Energy Conservation. 

 

38

Appendix A

JN comment: Appendix A

Compare Annex B22 starting on p. 134 of attached pdf, which is the current ASHRAE pub galley (internal, not for distribution), versus your App A and make changes to appropriate sections of what you have to match the content changes (obviously formats, section #s, eqn #s, etc would remain in your context not ASHRAE's)

Ron Judkoff:

One comment….I would not refer to the change from N-1 to N in the “Acceptance Criteria” as a correction, but as a change. Also, I believe that when we made that change we also changed the “example” confidence interval criteria so that the target ranges remained the same.

 

JN comment:

I agree with you on the first note -- we changed to "N" per a comment by an industry professional, although one could develop a supporting argument to keep the "N-1" in the denominator.

 

And yes on the second note, that's what we did. And the changes to the confidence coefficients for Florida-HERS BESTEST should be the same as those for HERS BESTEST, so that there are no changes to the acceptance ranges of Florida-HERS BESTEST Volume 2, but someone on the Florida side should double check that.

 

Change the equations to reflect only "N" in both numerator and denominator.

Update Appendix A to reflect 140-2011 informative annex changes

 

Oct. 17, 2011

WG:  Take out Appendix A.  Not needed.

 

 

 

 

 

 

 

 

 

 

 

 

WG:  Okay as is “N-1”.

38

 

JN comment: Update this for forthcoming 140-2007 Addendum B (will also be included in 140-2011) – check with Steve Ferguson if 140-2007-B is posted; I know 140-2007-C is not yet posted, but B might be.]

 

Acceptable

Need copy of 140-2011

 

Oct. 17, 2011

WG:  Use 140-2007 for consistency with the standard edition as referenced in the FBC, Energy Conservation.

42

 

JN comment:

Appendix B-delete

Appendix B is a hot hyperlink; please double-click to launch.  This reference will remain in the manual.  Appendix B currently contains: Model Validation and Testing: The Methodological Foundation of ASHRAE Standard 140

Preprint: R. Judkoff, National Renewable Energy Laboratory; J. Neymark, J. Neymark & Associates. Presented at the ASHRAE 2006 Annual Meeting, Quebec City, Canada, June 24–29, 2006

 

 

43

 

JN comment:

References

Judkoff and Neymark 2006 [enter in appropriate format using http://www.nrel.gov/docs/fy06osti/40360.pdf

Same content as

8.  Judkoff, R.; Neymark, J. (2006). Model Validation and Testing: The Methodological Foundation of ASHRAE Standard 140. ASHRAE Transactions: Papers Presented at the 2006 Annual Meeting, 24-28 June 2006, Quebec City, Canada. Atlanta, GA: American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc. (ASHRAE) Vol. 112, Pt. 2: pp. 367-376; NREL Report No. CP-550-41015. [JN comment: This looks like NREL Pubs may have a typo on the equivalent nrel report # as listed here?]

Add COMNET User’s Manual and ASHRAE 90.1.

Update references to reflect changes

Add COMNET User’s Manual and ASHRAE 90.1

Need working url or electronic copies of reference documents

 

Oct. 17, 2011

WG:  this should only be included in the reference document list for informational purposes.

 

 

 

 

 

 

 

Comments from William Wright

Forms 405 and 506

 

All forms required for permit submittal (i.e., Forms 405 and 506) should be shown in detail in the Manual or the Code.  It is unclear what data should be presented in these forms without a provided template.  Furthermore, allowing vendors to each have different formats will result in considerable frustration on the part of inspectors and users.

 

Add forms to the manual (create Appendices C and D).  The State currently does not have templates for these forms.

 

Form for EPL Card may be useful also.

Need copies of Forms identified in the code.  The EPL Display Card may be useful to add also.

405.4.3 Additional documentation. The code official shall require the following documents:

1. An EPL Display Card signed by the builder providing the building component characteristics of the proposed design shall be provided to the purchaser of the home at time of title transfer.

 

Oct. 17, 2011

WG: add an appendix necessary to describe Forms 405 and 506.  Use Energy Gauge printout to develop templates for these forms.  Also, include a copy of the EPL Display Card in the appendix.  

 

 

 

 

Comments from William Wright

 

Table B2.2 from Normative Appendix B in the revised version of the Energy Code is referenced by the Manual, and requires knowledge contained in Energy Gauge (See section on Building Envelope, Standard Reference Design, Table B2.2). We assume these references to Energy Gauge should be removed.

 

The State of FL states that the reference to Energy Gauge was removed from the current draft of the 2010 Florida Building Code, Energy Conservation.

Need the latest draft of the Florida building code

 

Oct. 17, 2011

Note:  No action needed.  Staff stated that this has been corrected.

 

23

Comments from William Wright

 

Table 2-1 without accompanying references

Scrub entire draft to add references where necessary.

Oct. 17, 2011

 

WG:  Agree with the comment in column 2.

 

Utility rates for the commercial option are available from the FPSC.

 

 

Additional comments submitted during the meeting

 

 

9

 

1.5

 

Copy of the Compliance…….A machine readable copy……zones.

 

Oct. 17, 2011

WG: change “machine readable” to “pdf”.

 

Note: need to differentiate between the term “Energy Code” and “program code”.

19

5.1

Paragraph – “The compliance software shall automatically ….the standard reference design to establish the energy budget.

 

Oct. 17, 2011

WG:  delete entire paragraph.  This section is more applicable to equipment sizing.

20

2nd

2nd Paragraph – clarify “expected results”

 

 

 

Oct. 17, 2011

WG: clarify text by referring to “BESTEST”.